A numerical comparison of Hessian update techniques for an SQCQP method
Author
Abstract
This short note considers Hessian updates for a sequential quadratically constrained quadratic programming (SQCQP) method. The SQCQP method is an iterative method for solving inequality-constrained optimization problems. At each iteration, it solves a quadratically constrained quadratic programming subproblem whose objective function and constraints are approximations of those of the original problem. A Hessian update is typically used to construct this subproblem, and the choice of update plays an important role in establishing global convergence of the SQCQP method. In this short note, we introduce several Hessian updates and apply them to the feasible SQCQP method of Solodov [12]. The updates considered are the exact Hessian matrix, a Levenberg-Marquardt type modification, the modified BFGS method of Powell [10], the cautious BFGS method proposed by Li and Fukushima [8], and the limited-memory BFGS method proposed by Wei, Yu, Yuan and Lian [15]. We present algorithms that integrate the feasible SQCQP method with these Hessian updates, and we examine their numerical behavior.
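To illustrate one of the updates named above: the cautious BFGS of Li and Fukushima [8] applies the standard BFGS update only when a curvature test tied to the current gradient norm holds, and otherwise keeps the previous approximation, which preserves positive definiteness. The following is a minimal sketch, not the implementation used in the note; the parameter values `eps` and `alpha` are illustrative choices.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of the Hessian approximation B,
    where s is the step and y is the gradient difference."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def cautious_bfgs_update(B, s, y, grad_norm, eps=1e-6, alpha=1.0):
    """Cautious BFGS in the spirit of Li and Fukushima [8]:
    update only if y^T s >= eps * ||g||^alpha * ||s||^2,
    so the approximation stays positive definite."""
    if y @ s >= eps * grad_norm**alpha * (s @ s):
        return bfgs_update(B, s, y)
    return B  # skip the update; keep the previous matrix
```

When the curvature test holds, the returned matrix satisfies the secant relation `B_new @ s == y` by construction; when it fails, the previous matrix is simply carried over.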
Similar articles
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme, due to Mahdavi-Amiri and Bartels, for the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
Using an Efficient Penalty Method for Solving Linear Least Square Problem with Nonlinear Constraints
In this paper, we use a penalty method for solving the linear least squares problem with nonlinear constraints. In each iteration of penalty methods for solving this problem, the computation of the projected Hessian matrix is required. Given that the objective function is linear least squares, the projected Hessian matrix of the penalty function consists of two parts, and the exact amount of a part of i...
An Efficient Reduced-order Method for Hessian Matrix Construction
When nonlinear behavior must be considered in sensitivity analysis studies, one needs to approximate higher-order derivatives of the response of interest with respect to all input data. This paper presents an application of a general reduced-order method to constructing such higher-order derivatives. In particular, we apply the method to constru...
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve an unconstrained optimization problem using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
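For context, the standard secant relation mentioned above requires the updated Hessian approximation to map the last step onto the corresponding gradient difference; in the usual quasi-Newton notation:

```latex
B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
```

Positive definiteness of $B_{k+1}$ together with this relation is what quasi-Newton updates such as BFGS are designed to guarantee when $y_k^\top s_k > 0$.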
An Explicit Quasi-Newton Update for Sparse Optimization Calculations
A new quasi-Newton updating formula for sparse optimization calculations is presented. It makes combined use of a simple strategy for enforcing symmetry and a Schubert correction to the upper triangle of a permuted Hessian approximation. Interesting properties of this new update are that it is in closed form and that it does not satisfy the secant condition at every iteration of the calculations. Som...